Multilayer Perceptrons based on Fuzzy Flip-Flops

Authors

  • Rita Lovassy
  • László T. Kóczy
  • László Gál
Abstract

The concept of the fuzzy flip-flop was introduced in the mid-1980s by Hirota and his students. The Hirota Lab recognized the essential importance of a fuzzy extension of sequential circuits and of the notion of fuzzy memory, and from this point of view proposed alternatives for "fuzzifying" digital flip-flops. The starting elementary digital units were binary J-K flip-flops, whose characteristic equation was used in both its minimal disjunctive and minimal conjunctive forms. As fuzzy connectives do not satisfy all Boolean axioms, the fuzzy equivalents of these equations resulted in two non-equivalent definitions, "reset type" and "set type" fuzzy flip-flops (F³), built from fuzzy negation, t-norm, and t-conorm operations. Hirota et al. recognized that the reset and set equations cannot easily be used as elements of a memory module because of their asymmetric nature. In their 1988 paper, Ozawa, Hirota and Kóczy proposed a unified form of the fuzzy J-K flip-flop characteristic equation combining the reset and set characteristics, based on min-max and algebraic norms. A few years later, the hardware implementation of these fuzzy flip-flop circuits in discrete and continuous mode was presented by the Hirota Lab. The hardware implementation of the algebraic-norm-based F³ pointed out the methodological benefit resulting from the deep mathematical relation between fuzzy systems and neural networks. The fuzzy flip-flop was proposed as the basic unit of fuzzy register circuits.

The multilayer perceptron (MLP) is a widely used artificial neural network type. The MLP as a function approximator is an object of study in various engineering applications as well as in applied mathematics and computer science. We introduced Fuzzy Flip-Flop based Neural Networks (FNN), multilayer perceptrons based on fuzzy flip-flops. The proposed network is a structure consisting of fuzzy flip-flops of the same type.
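The reset-type, set-type, and unified characteristic equations described above can be sketched in code. This is an illustrative reconstruction, not the authors' implementation: the unified form is assumed here to be Q(t+1) = (J ∨ ¬K) ∧ (J ∨ Q) ∧ (¬K ∨ ¬Q), as commonly cited for the Ozawa-Hirota-Kóczy F³, and all function names are hypothetical.

```python
# Hedged sketch of reset-type, set-type, and unified fuzzy J-K flip-flop
# characteristic equations; helper names are illustrative only.

def reset_type(J, K, Q, t_norm, s_norm, neg=lambda x: 1.0 - x):
    """Reset-type F3 (minimal disjunctive form): (J ^ ~Q) v (~K ^ Q)."""
    return s_norm(t_norm(J, neg(Q)), t_norm(neg(K), Q))

def set_type(J, K, Q, t_norm, s_norm, neg=lambda x: 1.0 - x):
    """Set-type F3 (minimal conjunctive form): (J v Q) ^ (~K v ~Q)."""
    return t_norm(s_norm(J, Q), s_norm(neg(K), neg(Q)))

def fuzzy_jk(J, K, Q, t_norm, s_norm, neg=lambda x: 1.0 - x):
    """Assumed unified F3 form: (J v ~K) ^ (J v Q) ^ (~K v ~Q)."""
    return t_norm(t_norm(s_norm(J, neg(K)), s_norm(J, Q)),
                  s_norm(neg(K), neg(Q)))

# Min-max norms and algebraic norms.
t_min, s_max = min, max
t_alg = lambda a, b: a * b
s_alg = lambda a, b: a + b - a * b

# Crisp inputs reproduce the binary J-K truth table for both norm pairs:
for tn, sn in ((t_min, s_max), (t_alg, s_alg)):
    assert fuzzy_jk(1, 0, 0, tn, sn) == 1 and fuzzy_jk(1, 1, 1, tn, sn) == 0

# With fuzzy inputs the reset and set next-states generally differ,
# which is the non-equivalence mentioned in the text:
J, K, Q = 0.8, 0.3, 0.5
r = reset_type(J, K, Q, t_alg, s_alg)
s = set_type(J, K, Q, t_alg, s_alg)
print(r, s)  # not equal in general
```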
In the FNN the neurons are substituted by fuzzy J-K flip-flops with feedback (in reality, a type of combinational circuit) and by fuzzy D flip-flops derived from the fuzzy J-K F³, based on frequently used fuzzy operations (e.g. algebraic, Łukasiewicz, Yager, Dombi, Hamacher, Frank and Dubois-Prade norms). Several types of fuzzy J-K and D flip-flops based on various fuzzy operations and standard complementation have been proposed. We determined their characteristic equations, illustrating and comparing their properties. It was shown that, broadly, they may be classified into two groups: one presenting quasi-S-shaped transfer characteristics, the rest having a non-sigmoid character. The construction of a fuzzy neuron unit from fuzzy J-K and D flip-flops was proposed, indicating a possible connection between this fuzzy unit and the artificial neuron, the basic component of neural networks. The main idea is to obtain a fuzzy neural network structure that can be implemented simply in hardware: the fuzzy norms, and moreover the simple fuzzy flip-flop characteristic equations, are uncomplicated to realize in hardware. We proposed the FNN as a neural network with general-purpose hardware which could be more user-friendly; it is not bound to algorithmic a priori assumptions and therefore offers high flexibility. We defined a new pair of fuzzy intersection and union operations. The proposed connectives, called trigonometric norms, consist of simple combinations of trigonometric functions. The basic motivation for constructing the new norms was to obtain fuzzy flip-flops with sigmoid transfer characteristics in certain particular cases. We compared the function approximation capability of FNNs with one and two hidden layers based on Dombi, Łukasiewicz and the new trigonometric norms.
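The derivation of a fuzzy D flip-flop neuron from the J-K equation, and its quasi-sigmoid transfer characteristic, can be illustrated as follows. Everything here is a hedged reconstruction: the trigonometric norm pair is assumed to be t(x,y) = (2/π)·arcsin(sin(πx/2)·sin(πy/2)) and s(x,y) = (2/π)·arccos(cos(πx/2)·cos(πy/2)), the D flip-flop is assumed to follow from the unified J-K form with J = D and K = ¬D, and all function names are illustrative.

```python
import math

# Assumed trigonometric norm pair (simple combinations of trig functions,
# satisfying the t-norm / t-conorm boundary conditions).
def t_trig(a, b):
    return (2 / math.pi) * math.asin(
        math.sin(a * math.pi / 2) * math.sin(b * math.pi / 2))

def s_trig(a, b):
    return (2 / math.pi) * math.acos(
        math.cos(a * math.pi / 2) * math.cos(b * math.pi / 2))

def fuzzy_d(D, Q, t_norm, s_norm, neg=lambda x: 1.0 - x):
    """Fuzzy D flip-flop obtained from the assumed unified J-K equation
    (J v ~K) ^ (J v Q) ^ (~K v ~Q) by setting J = D, K = neg(D)."""
    return t_norm(t_norm(s_norm(D, D), s_norm(D, Q)), s_norm(D, neg(Q)))

# Sampling Q(t+1) as a function of the input D (state Q held fixed)
# traces the transfer characteristic used as the neuron's activation:
Q = 0.5
curve = [fuzzy_d(d / 10, Q, t_trig, s_trig) for d in range(11)]
print(curve)  # monotone increasing from 0 to 1
```

With idempotent min-max norms this construction degenerates to the identity, which is why non-idempotent norms (algebraic, Łukasiewicz, trigonometric, etc.) are needed to obtain the S-shaped curves.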
A special combination of the Levenberg-Marquardt (LM) method with the Bacterial Evolutionary Algorithm (BEA), the Bacterial Memetic Algorithm with Modified Operator Execution Order (BMAM), was proposed and applied for FNN variable optimization and training. We showed that the function approximation accuracy of FNNs depends not only on the selected network structure and parameters, on the number of hidden layers, and on the hidden units, but is also strongly influenced by the fuzzy flip-flop and operator type, by their parameter values, and by the learning algorithm. Comparing the simulation results, we concluded that NNs with two hidden layers based on Łukasiewicz-type fuzzy D flip-flop neurons performed best; they had the best generalization capability.

Acknowledgments

This work is partially supported by Óbuda University Grants, the project TÁMOP 421B, a Széchenyi István University Main Research Direction Grant, and the National Scientific Research Fund Grants OTKA K 75711 and K 105529.
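The memetic scheme of interleaving an evolutionary operator with a gradient-based local search can be sketched schematically. This is not the authors' BMAM: the bacterial mutation below is a generic best-of-clones gene mutation, a numerical gradient step stands in for Levenberg-Marquardt, the objective is a toy quadratic, and parameters such as `n_clones` and `lr` are invented for illustration.

```python
import random

def loss(p):
    # Toy objective: squared distance from a known optimum, standing in
    # for the FNN training error over encoded network parameters.
    target = [0.3, -1.2, 0.7]
    return sum((a - b) ** 2 for a, b in zip(p, target))

def bacterial_mutation(p, n_clones=5, scale=0.5, rng=random):
    """Perturb one randomly chosen gene in several clones; keep the best."""
    i = rng.randrange(len(p))
    best = list(p)
    for _ in range(n_clones):
        clone = list(p)
        clone[i] += rng.gauss(0.0, scale)
        if loss(clone) < loss(best):
            best = clone
    return best

def local_step(p, lr=0.1, eps=1e-6):
    """Numerical-gradient descent step (simplified stand-in for LM)."""
    grad = []
    for i in range(len(p)):
        q = list(p)
        q[i] += eps
        grad.append((loss(q) - loss(p)) / eps)
    return [a - lr * g for a, g in zip(p, grad)]

rng = random.Random(0)
p = [0.0, 0.0, 0.0]
for _ in range(200):
    p = bacterial_mutation(p, rng=rng)
    p = local_step(p)  # local search runs after every mutation step
print(loss(p))  # converges close to 0 on this toy objective
```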

Similar Articles

Optimizing Fuzzy Flip-Flop Based Neural Networks by Bacterial Memetic Algorithm

In our previous work we proposed Multilayer Perceptron Neural Networks (MLP NN) consisting of fuzzy flip-flops (F³) based on various operations. We showed that this kind of fuzzy neural network has good learning properties. In this paper we propose an evolutionary approach for optimizing fuzzy flip-flop networks (FNN). Various popular fuzzy operations and three different fuzzy flip-flop types w...

Applications of Fuzzy Program Graph in Symbolic Checking of Fuzzy Flip-Flops

All practical digital circuits are usually a mixture of combinational and sequential logic. Flip-flops are essential to sequential logic; therefore fuzzy flip-flops are considered to be among the most essential topics of fuzzy digital circuits. The concept of the fuzzy digital circuit is among the most interesting applications of fuzzy sets and logic, due to the fact that if there has to be an ultimat...

Parameter optimisation in fuzzy flip-flop-based neural networks

This paper presents a method for optimizing the parameters of Multilayer Perceptron Neural Networks (MLP NN) consisting of fuzzy flip-flops (F³) based on various operations, using the Bacterial Memetic Algorithm with Modified Operator Execution Order (BMAM). In earlier work, the authors proposed the gradient-based Levenberg-Marquardt (LM) algorithm for variable optimization. The BMAM local and glo...

Acta Technica Jaurinensis - Vol. 4, No. 1 (2011)

In this paper two types of neural networks, namely "traditional" tansig-based neural networks and multilayer perceptrons based on fuzzy flip-flops (FNN), trained by the Bacterial Memetic Algorithm with Modified Operator Execution Order (BMAM), are tested and compared with respect to their robustness to outliers in the test functions. The robust design of the FNN is presented, and the best suitable fuzzy neuro...

Quasi Optimization of Fuzzy Neural Networks

The fuzzy flip-flop based multilayer perceptron, named Fuzzy Neural Network (FNN), is proposed for function approximation. In recent years much effort has been made to develop a special kind of bacterial memetic algorithm for the optimization and training of fuzzy neural network parameters. In this approach the FNN parameters are encoded in a chromosome and participate in the ba...


Publication date: 2012